Phase diagram of restricted Boltzmann machines and generalized Hopfield networks with arbitrary priors.
Authors
Abstract
Restricted Boltzmann machines are described by the Gibbs measure of a bipartite spin glass, which in turn can be seen as a generalized Hopfield network. This equivalence allows us to characterize the state of these systems in terms of their ability to retrieve pure states, both at low and high load. We study the paramagnetic-spin glass and the spin glass-retrieval phase transitions as the pattern (i.e., weight) distribution and the spin (i.e., unit) priors vary smoothly from Gaussian real variables to Boolean discrete variables. Our analysis shows that the presence of a retrieval phase is robust and not peculiar to the standard Hopfield model with Boolean patterns. The retrieval region grows as the pattern entries and retrieval units become more peaked and, conversely, as the hidden units acquire a broader prior and therefore respond more strongly to high fields. Moreover, at low load retrieval always exists below some critical temperature, for every pattern distribution ranging from the Boolean to the Gaussian case.
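The equivalence invoked above can be made explicit by a short, standard calculation; the following is a minimal sketch in notation assumed here (binary visible spins, Gaussian hidden units, patterns as RBM weights), not reproduced from the paper itself.

% Sketch: marginalizing Gaussian hidden units of an RBM yields a Hopfield Hamiltonian.
% Assumed notation: visible spins \sigma_i = \pm 1, hidden units z_\mu with standard
% Gaussian prior, patterns (weights) \xi_i^\mu, i = 1..N, \mu = 1..P.
\begin{align}
  P(\sigma, z) &\propto
    \exp\!\Bigl(-\tfrac{1}{2}\sum_{\mu=1}^{P} z_\mu^{2}
      + \frac{\beta}{\sqrt{N}} \sum_{i=1}^{N}\sum_{\mu=1}^{P} \xi_i^{\mu}\,\sigma_i\, z_\mu \Bigr), \\
  P(\sigma) &= \int \prod_{\mu=1}^{P} \mathrm{d}z_\mu \; P(\sigma, z)
    \;\propto\;
    \exp\!\Bigl( \frac{\beta^{2}}{2N} \sum_{\mu=1}^{P}
      \Bigl(\sum_{i=1}^{N} \xi_i^{\mu}\sigma_i\Bigr)^{2} \Bigr)
    \;=\; \exp\!\bigl(-\beta^{2} H_{\mathrm{Hopfield}}(\sigma)\bigr),
\end{align}
% with H_{Hopfield}(\sigma) = -\frac{1}{2N}\sum_{\mu}\sum_{i,j}\xi_i^\mu \xi_j^\mu \sigma_i\sigma_j,
% i.e. a Hopfield network whose couplings are built from the RBM weights.
% Replacing the Gaussian hidden prior (or the Boolean visible prior) by a different
% prior deforms this effective interaction, which is the interpolation studied in the paper.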
Similar references
Phase transitions in Restricted Boltzmann Machines with generic priors
We study generalized restricted Boltzmann machines with generic priors for units and weights, interpolating between Boolean and Gaussian variables. We present a complete analysis of the replica symmetric phase diagram of these systems, which can be regarded as generalized Hopfield models. We underline the role of the retrieval phase for both inference and learning processes and we show that ret...
Statistical Physics and Representations in Real and Artificial Neural Networks
This document presents the material of two lectures on statistical physics and neural representations, delivered by one of us (R.M.) at the Fundamental Problems in Statistical Physics XIV summer school in July 2017. In a first part, we consider the neural representations of space (maps) in the hippocampus. We introduce an extension of the Hopfield model, able to store multiple spatial maps as c...
On the equivalence of Hopfield networks and Boltzmann Machines
A specific type of neural network, the Restricted Boltzmann Machine (RBM), is implemented for classification and feature detection in machine learning. RBMs are characterized by separate layers of visible and hidden units, which can efficiently learn a generative model of the observed data. We study a "hybrid" version of RBMs, in which hidden units are analog and visible units are bi...
Advances in Deep Learning
Deep neural networks have recently become increasingly popular under the name of deep learning, owing to their success in challenging machine learning tasks. Although this popularity stems mainly from recent successes, the history of neural networks goes as far back as 1958, when Rosenblatt presented a perceptron learning algorithm. Since then, various kinds of artificial neural networks hav...
Foundations and Advances in Deep Learning
Aalto University, P.O. Box 11000, FI-00076 Aalto, www.aalto.fi. Author: Kyunghyun Cho. Name of the doctoral dissertation: Foundations and Advances in Deep Learning. Publisher Unit: Department of Information and Computer Science. Series: Aalto University publication series DOCTORAL DISSERTATIONS 21/2014. Field of research: Machine Learning. Manuscript submitted: 2 September 2013. Date of the defence: 21 March ...
Journal: Physical Review E
Volume: 97, Issue: 2-1
Pages: -
Publication year: 2018